Non-Asymptotic Theory of Random Matrices Lecture 16: Invertibility of Gaussian Matrices and Compressible/Incompressible Vectors

Author

  • Matthew Herman
Abstract

We begin this lecture by asking: why should an arbitrary n × n Gaussian matrix A be invertible? That is, is there a lower bound on its smallest singular value,

s_n(A) = inf_{x ∈ S^{n−1}} ‖Ax‖_2 ≥ c/√n,

where c > 0 is an absolute constant? There are two reasons (or cases), which we pursue in this lecture.

1. In Lecture 15 we saw that the invertibility of rectangular (i.e., non-square) Gaussian matrices yields invertibility of A on all sparse vectors. Specifically, we derived the Sparse Lemma (Lemma 5), which states: there exists an absolute constant δ ∈ (0, 1) such that, with probability 1 − e^{−cn},

inf { ‖Ax‖_2 : x ∈ S^{n−1}, x is (δn)-sparse } ≥ c√n.   (1)

2. Suppose that the rows X_1, …, X_n of A are "very" linearly independent. This has a geometric interpretation, as we saw in the Distance Lemma of Lecture 14: let H_k = span(X_j : j ≠ k) be the hyperplane spanned by the other rows. Then P(dist(X_k, H_k) < ε) ∼ ε, and it follows that

‖Ax‖_2 ≥ max_k |x_k| · dist(X_k, H_k)   (2)

for all x ∈ S^{n−1}. This yields invertibility of A on all spread vectors; e.g., if |x_k| ∼ 1/√n for every k, then ‖Ax‖_2 ≥ const/√n.
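A minimal numerical sketch of this dichotomy (my own illustration, not part of the lecture notes; the dimension n, the sparsity fraction δ, and the test vectors are arbitrary choices): it samples a Gaussian matrix, reports s_n(A)·√n, and probes the two regimes via the smallest singular value of a tall n × δn column submatrix (the sparse case) and ‖Ax‖_2 for the flat vector x_k = 1/√n (the spread case).

```python
# Illustrative sketch of the two invertibility regimes (parameters are
# arbitrary choices, not constants from the lecture).
import numpy as np

rng = np.random.default_rng(0)
n = 200
delta = 0.1
k = int(delta * n)                       # sparsity level: (delta*n)-sparse

A = rng.standard_normal((n, n))          # n x n Gaussian matrix

# Full matrix: s_n(A) is expected to be of order 1/sqrt(n), so the
# rescaled quantity s_n(A) * sqrt(n) should stay bounded below.
s_n = np.linalg.svd(A, compute_uv=False)[-1]
print("s_n(A) * sqrt(n)              =", round(s_n * np.sqrt(n), 3))

# Sparse case: for unit x supported on a fixed k-element set S,
# ||Ax||_2 >= s_min(A_S), where A_S is the tall n x k submatrix of
# columns indexed by S, and s_min(A_S) is of order sqrt(n).
S = rng.choice(n, size=k, replace=False)
s_sparse = np.linalg.svd(A[:, S], compute_uv=False)[-1]
print("s_min(A_S) / sqrt(n)          =", round(s_sparse / np.sqrt(n), 3))

# Spread case: for the flat unit vector x_k = 1/sqrt(n), the Distance
# Lemma bound only guarantees ||Ax||_2 >= const/sqrt(n); the typical
# value is of course much larger.
x = np.full(n, 1.0 / np.sqrt(n))
print("||Ax||_2 for the flat vector  =", round(float(np.linalg.norm(A @ x)), 3))
```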


Similar Articles

Non-Asymptotic Theory of Random Matrices

Let A be an n × n subgaussian matrix (its entries are i.i.d. subgaussian random variables with variance 1). There are two reasons for the invertibility of A, depending on the nature of the unit vector on which A is acting – either compressible or incompressible. We recall that compressible vectors are those whose distance is at most some constant ρ from the set of (δn)-sparse vectors, and incompressible vecto...
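To make the recalled definition concrete, here is a small sketch (my own, with placeholder values for δ and ρ rather than the constants of the lemma): the Euclidean distance from x to the set of (δn)-sparse vectors is the norm of x minus its best (δn)-term approximation, i.e. the norm of its n − δn smallest-magnitude coordinates.

```python
# Distance from a vector to the set of k-sparse vectors, and a
# compressible/incompressible test (delta and rho are placeholder values).
import numpy as np

def dist_to_sparse(x: np.ndarray, k: int) -> float:
    """Euclidean distance from x to the set of k-sparse vectors:
    the norm of everything outside the k largest-magnitude coordinates."""
    tail = np.sort(np.abs(x))[:-k]
    return float(np.linalg.norm(tail))

def is_compressible(x: np.ndarray, delta: float = 0.1, rho: float = 0.3) -> bool:
    k = max(1, int(delta * len(x)))
    return dist_to_sparse(x, k) <= rho

n = 1000
flat = np.full(n, 1.0 / np.sqrt(n))      # spread unit vector
spiky = np.zeros(n)
spiky[:5] = 1.0 / np.sqrt(5)             # unit vector supported on 5 coordinates
print(is_compressible(flat))             # False: incompressible
print(is_compressible(spiky))            # True: compressible
```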


Non-Asymptotic Theory of Random Matrices Lecture 18: Strong invertibility of subgaussian matrices and Small ball probability via arithmetic progression

1 Strong invertibility of subgaussian matrices. In the last lecture, we derived an estimate for the smallest singular value of a subgaussian random matrix: Theorem 1. Let A be an n × n subgaussian matrix. Then, for any ε > 0,

P(s_n(A) < ε/√n) ≤ cε + Cn^{−1/2}.   (1)

In particular, this implies s_n(A) ∼ 1/√n with high probability. However, (1) cannot show that P(s_n(A) < ε/√n) → 0 as ε → 0 because of the ...
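A hedged Monte Carlo illustration of the quoted estimate (my own sketch; the dimension, the number of trials, and the choice of Rademacher ±1 entries as the subgaussian model are arbitrary): it estimates P(s_n(A) < ε/√n) for a few values of ε.

```python
# Monte Carlo estimate of P(s_n(A) < eps/sqrt(n)) for a subgaussian model
# (Rademacher entries); all parameters are illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
n, trials = 100, 300
s = np.array([np.linalg.svd(rng.choice([-1.0, 1.0], size=(n, n)),
                            compute_uv=False)[-1]
              for _ in range(trials)])
for eps in (0.05, 0.1, 0.2, 0.4):
    p_hat = np.mean(s < eps / np.sqrt(n))
    print(f"eps = {eps:4.2f}   estimated P[s_n(A) < eps/sqrt(n)] = {p_hat:.3f}")
```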


Introduction to the non-asymptotic analysis of random matrices

2 Preliminaries: 2.1 Matrices and their singular values; 2.2 Nets; 2.3 Sub-gaussian random variables; 2.4 Sub-exponential random variables; 2.5 Isotropic random vectors ...


Statistical Mechanics and Random Matrices

Statistical Mechanics and Random Matrices: 1. Introduction; 2. Motivations; 3. The different scales; typical results. Lecture 1. Wigner matrices and moments estimates: 1. Wigner's theorem; 2. Words in several independent Wigner matrices; 3. Estimates on the largest eigenvalue of Wigner matrices. Lecture 2. Gaussian Wigner matrices and Fredholm determinants: 1. Joint law of the ei...


On the asymptotic eigenvalue distribution of concatenated vector-valued fading channels

The linear vector-valued channel, with additive white Gaussian noise and independent random matrices, is analyzed in the asymptotic regime as the dimensions of the matrices and vectors involved become large. The asymptotic eigenvalue distribution of the channel's covariance matrix is given in terms of an implicit equation for its Stieltjes transform as well as an exp...
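A purely illustrative companion (my own toy model, not the paper's channel): the empirical eigenvalue distribution of the covariance HH* of a concatenated matrix H = H2·H1 with i.i.d. complex Gaussian entries. The dimensions and normalisation are placeholder choices, and the Stieltjes-transform characterisation from the paper is not computed here.

```python
# Empirical eigenvalues of the covariance H H* of a toy concatenated
# channel H = H2 @ H1 (illustrative dimensions and normalisation only).
import numpy as np

rng = np.random.default_rng(2)
n = 300
H1 = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2 * n)
H2 = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2 * n)
H = H2 @ H1
eigs = np.linalg.eigvalsh(H @ H.conj().T)     # covariance matrix is Hermitian
print("eigenvalue quartiles:", np.round(np.percentile(eigs, [25, 50, 75]), 4))
```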




Publication date: 2007